CONTENTS

  1. BIG DATA AND DEEP LEARNING INTRODUCTION
  2. 1.1 MATLAB AND BIG DATA
  3. 1.1.1 Access Data
  4. 1.1.2 Explore, Process, and Analyze Data
  5. 1.1.3 Develop Predictive Models
  6. 1.2 DEEP LEARNING
  7. 1.2.1 Definitions
  8. 1.2.2 Concepts
  9. 1.2.3 Deep learning and neural networks
  10. 1.2.4 Deep neural networks
  11. 1.2.5 Convolutional neural networks
  12. 1.2.6 Recursive neural networks
  13. 1.2.7 Long short-term memory
  14. 1.2.8 Deep belief networks
  15. 1.2.9 Convolutional deep belief networks
  16. 1.2.10 Large memory storage and retrieval neural networks
  17. 1.2.11 Deep Boltzmann machines
  18. 1.2.12 Encoder–decoder networks
  19. 1.2.13 Deep learning applications
  20. 1.3 DEEP LEARNING WITH MATLAB: NEURAL NETWORK TOOLBOX (DEEP LEARNING TOOLBOX)
  21. 1.4 Using DEEP LEARNING Toolbox
  22. 1.5 Automatic Script Generation
  23. 1.6 DEEP LEARNING Toolbox Applications
  24. 1.7 Neural Network Design Steps
  25. DEEP LEARNING WITH MATLAB: CONVOLUTIONAL NEURAL NETWORKS. FUNCTIONS
  26. 2.2.1 Create an image input layer: imageInputLayer
  27. 2.2.2 Create a 2-D convolutional layer: convolution2dLayer
  28. 2.2.3 Create a Rectified Linear Unit (ReLU) layer: reluLayer
  29. 2.2.4 Create a local response normalization layer: crossChannelNormalizationLayer
  30. 2.2.5 Create an average pooling layer: averagePooling2dLayer
  31. 2.2.6 Create a max pooling layer: maxPooling2dLayer
  32. 2.2.7 Create a fully connected layer: fullyConnectedLayer
  33. 2.2.8 Create a dropout layer: dropoutLayer
  34. 2.2.9 Create a softmax layer: softmaxLayer
  35. 2.2.10 Create a classification output layer: classificationLayer
  36. 2.3.1 Train a network: trainNetwork
  37. 2.3.2 Options for training a neural network: trainingOptions
  38. 2.4 Extract Features and Predict Outcomes. FUNCTIONS
  39. 2.4.1 Compute network layer activations: activations
  40. 2.4.2 Predict responses using a trained network: predict
  41. 2.4.3 Classify data using a trained network: classify
  42. DEEP LEARNING WITH MATLAB: CONVOLUTIONAL NEURAL NETWORKS. CLASSES
  43. 3.2.1 Network layer: layer
  44. 3.3.1 Series network class: SeriesNetwork
  45. 3.3.2 Training options for stochastic gradient descent with momentum: TrainingOptionsSGDM
  46. 3.4.1 ImageInputLayer class
  47. 3.4.2 Convolution2DLayer class
  48. 3.4.3 ReLULayer class
  49. 3.4.4 CrossChannelNormalizationLayer class
  50. 3.4.5 AveragePooling2DLayer class
  51. 3.4.6 MaxPooling2DLayer class
  52. 3.4.7 FullyConnectedLayer class
  53. 3.4.8 DropoutLayer class
  54. 3.4.9 SoftmaxLayer class
  55. 3.4.10 ClassificationOutputLayer class
  56. DEEP LEARNING WITH MATLAB: IMAGE CATEGORY CLASSIFICATION
  57. 4.1 Overview
  58. 4.2 Check System Requirements
  59. 4.3 Download Image Data
  60. 4.4 Load Images
  61. 4.5 Download Pre-trained Convolutional Neural Network (CNN)
  62. 4.6 Load Pre-trained CNN
  63. 4.7 Pre-process Images For CNN
  64. 4.8 Prepare Training and Test Image Sets
  65. 4.9 Extract Training Features Using CNN
  66. 4.10 Train A Multiclass SVM Classifier Using CNN Features
  67. 4.11 Evaluate Classifier
  68. 4.12 Try the Newly Trained Classifier on Test Images
  69. 4.13 References
  70. DEEP LEARNING WITH MATLAB: TRANSFER LEARNING USING CONVOLUTIONAL NEURAL NETWORKS AND PRETRAINED CONVOLUTIONAL NEURAL NETWORKS
  71. 5.1 Transfer Learning Using Convolutional Neural Networks
  72. 5.2 Pretrained Convolutional Neural Network
  73. DEEP LEARNING WITH MATLAB: FUNCTIONS FOR PATTERN RECOGNITION AND CLASSIFICATION. AUTOENCODER
  74. 6.1 INTRODUCTION
  75. 6.2 View a Neural Network: view
  76. 6.3 Pattern Recognition and Learning Vector Quantization
  77. 6.3.1 Pattern recognition network: patternnet
  78. 6.3.2 Learning vector quantization neural network: lvqnet
  79. 6.4 Training Options and Network Performance
  80. 6.4.1 Receiver operating characteristic: roc
  81. 6.4.2 Plot receiver operating characteristic: plotroc
  82. 6.4.3 Plot classification confusion matrix: plotconfusion
  83. 6.4.4 Neural network performance: crossentropy
  84. 6.4.5 Construct and Train a Function Fitting Network
  85. 6.4.6 Create and train Feedforward Neural Network
  86. 6.4.7 Create and Train a Cascade Network
  87. 6.5 Network performance
  88. 6.5.1 Description
  89. 6.5.2 Examples
  90. 6.6 Fit Regression Model and Plot Fitted Values versus Targets
  91. 6.6.1 Description
  92. 6.6.2 Examples
  93. 6.7 Plot Output and Target Values
  94. 6.7.1 Description
  95. 6.7.2 Examples
  96. 6.8 Plot Training State Values
  97. 6.9 Plot Performances
  98. 6.10 Plot Histogram of Error Values
  99. 6.10.1 Syntax
  100. 6.10.2 Description
  101. 6.10.3 Examples
  102. 6.11 Generate MATLAB function for simulating neural network
  103. 6.11.1 Create Functions from Static Neural Network
  104. 6.11.2 Create Functions from Dynamic Neural Network
  105. 6.12 A COMPLETE EXAMPLE: House Price Estimation
  106. 6.12.1 The Problem: Estimate House Values
  107. 6.12.2 Why Neural Networks?
  108. 6.12.3 Preparing the Data
  109. 6.12.4 Fitting a Function with a Neural Network
  110. 6.12.5 Testing the Neural Network
  111. 6.13 Autoencoder class
  112. 6.14 Autoencoder FUNCTIONS
  113. 6.14.1 Functions
  114. 6.14.2 trainAutoencoder
  115. 6.14.3 decode
  116. 6.14.4 encode
  117. 6.14.5 predict
  118. 6.14.6 stack
  119. 6.14.7 generateFunction
  120. 6.14.8 generateSimulink
  121. 6.14.9 plotWeights
  122. 6.14.10 view
  123. 6.15 Construct Deep Network Using Autoencoders
  124. DEEP LEARNING WITH MATLAB: MULTILAYER NEURAL NETWORKS
  125. 7.1 Create, Configure, and Initialize Multilayer Neural Networks
  126. 7.1.1 Other Related Architectures
  127. 7.2 Functions to Create, Configure, and Initialize Multilayer Neural Networks
  128. 7.2.1 Initializing Weights (init)
  129. 7.2.2 feedforwardnet
  130. 7.2.3 configure
  131. 7.2.4 init
  132. 7.2.5 train
  133. 7.2.6 trainlm
  134. 7.2.7 tansig
  135. 7.2.8 purelin
  136. 7.2.9 cascadeforwardnet
  137. 7.2.10 patternnet
  138. 7.3 Train and Apply Multilayer Neural Networks
  139. 7.3.1 Training Algorithms
  140. 7.3.2 Training Example
  141. 7.3.3 Use the Network
  142. 7.4 Training Algorithms in Multilayer Neural Networks
  143. 7.4.1 trainbr: Bayesian regularization
  144. 7.4.2 trainscg: Scaled conjugate gradient backpropagation
  145. 7.4.3 trainrp: Resilient backpropagation
  146. 7.4.4 trainbfg: BFGS quasi-Newton backpropagation
  147. 7.4.5 traincgb: Conjugate gradient backpropagation with Powell-Beale restarts
  148. 7.4.6 traincgf: Conjugate gradient backpropagation with Fletcher-Reeves updates
  149. 7.4.7 traincgp: Conjugate gradient backpropagation with Polak-Ribière updates
  150. 7.4.8 trainoss: One-step secant backpropagation
  151. 7.4.9 traingdx: Gradient descent with momentum and adaptive learning rate backpropagation
  152. 7.4.10 traingdm: Gradient descent with momentum backpropagation
  153. 7.4.11 traingd: Gradient descent backpropagation
  154. DEEP LEARNING WITH MATLAB: ANALYZE AND DEPLOY TRAINED NEURAL NETWORK
  155. 8.1 ANALYZE NEURAL NETWORK PERFORMANCE
  156. 8.2 Improving Results
  157. 8.3 Deployment Functions and Tools for Trained Networks
  158. 8.4 Generate Neural Network Functions for Application Deployment
  159. 8.5 Deploy Neural Network Simulink Diagrams
  160. 8.5.1 Example
  161. 8.5.2 Suggested Exercises
  162. 8.6 Deploy Training of Neural Networks
  163. TRAINING SCALABILITY AND EFFICIENCY
  164. 9.1 Neural Networks with Parallel and GPU Computing
  165. 9.1.1 Modes of Parallelism
  166. 9.1.2 Distributed Computing
  167. 9.1.3 Single GPU Computing
  168. 9.1.4 Distributed GPU Computing
  169. 9.1.5 Deep Learning
  170. 9.1.6 Parallel Time Series
  171. 9.1.7 Parallel Availability, Fallbacks, and Feedback
  172. 9.2 Automatically Save Checkpoints During Neural Network Training
  173. 9.3 Optimize Neural Network Training Speed and Memory
  174. 9.3.1 Memory Reduction
  175. 9.3.2 Fast Elliot Sigmoid
  176. DEEP LEARNING WITH MATLAB: OPTIMAL SOLUTIONS
  177. 10.1 Representing Unknown or Don’t-Care Targets
  178. 10.1.1 Choose Neural Network Input-Output Processing Functions
  179. 10.1.2 Representing Unknown or Don’t-Care Targets
  180. 10.2 Configure Neural Network Inputs and Outputs
  181. 10.3 Divide Data for Optimal Neural Network Training
  182. 10.4 Choose a Multilayer Neural Network Training Function
  183. 10.4.1 SIN Data Set
  184. 10.4.2 PARITY Data Set
  185. 10.4.3 ENGINE Data Set
  186. 10.4.4 CANCER Data Set
  187. 10.4.5 CHOLESTEROL Data Set
  188. 10.4.6 DIABETES Data Set
  189. 10.4.7 Summary
  190. 10.5 Improve Neural Network Generalization and Avoid Overfitting
  191. 10.5.1 Retraining Neural Networks
  192. 10.5.2 Multiple Neural Networks
  193. 10.5.3 Early Stopping
  194. 10.5.4 Index Data Division (divideind)
  195. 10.5.5 Random Data Division (dividerand)
  196. 10.5.6 Block Data Division (divideblock)
  197. 10.5.7 Interleaved Data Division (divideint)
  198. 10.5.8 Regularization
  199. 10.5.9 Modified Performance Function
  200. 10.5.10 Automated Regularization (trainbr)
  201. 10.5.11 Summary and Discussion of Early Stopping and Regularization
  202. 10.5.12 Posttraining Analysis (regression)
  203. 10.6 Train Neural Networks with Error Weights
  204. 10.7 Normalize Errors of Multiple Outputs
  205. DEEP LEARNING WITH MATLAB: CLASSIFICATION WITH NEURAL NETWORKS. EXAMPLES
  206. 11.1 Crab Classification
  207. 11.1.1 Why Neural Networks?
  208. 11.1.2 Preparing the Data
  209. 11.1.3 Building the Neural Network Classifier
  210. 11.1.4 Testing the Classifier
  211. 11.2 Wine Classification
  212. 11.2.1 The Problem: Classify Wines
  213. 11.2.2 Why Neural Networks?
  214. 11.2.3 Preparing the Data
  215. 11.2.4 Pattern Recognition with a Neural Network
  216. 11.2.5 Testing the Neural Network
  217. 11.3 Cancer Detection
  218. 11.3.1 Formatting the Data
  219. 11.3.2 Ranking Key Features
  220. 11.3.3 Classification Using a Feedforward Neural Network
  221. 11.4 Character Recognition
  222. 11.4.1 Creating the First Neural Network
  223. 11.4.2 Training the First Neural Network
  224. 11.4.3 Training the Second Neural Network
  225. 11.4.4 Testing Both Neural Networks
  226. DEEP LEARNING WITH MATLAB: AUTOENCODERS AND CLUSTERING WITH NEURAL NETWORKS. EXAMPLES
  227. 12.1 Train Stacked Autoencoders for Image Classification
  228. 12.1.1 Data set
  229. 12.1.2 Training the first autoencoder
  230. 12.1.3 Visualizing the weights of the first autoencoder
  231. 12.1.4 Training the second autoencoder
  232. 12.1.5 Training the final softmax layer
  233. 12.1.6 Forming a stacked neural network
  234. 12.1.7 Fine-tuning the deep neural network
  235. 12.1.8 Summary
  236. 12.2 Transfer Learning Using Convolutional Neural Networks
  237. 12.3 Iris Clustering
  238. 12.3.1 Why Self-Organizing Map Neural Networks?
  239. 12.3.2 Preparing the Data
  240. 12.3.3 Clustering with a Neural Network
  241. 12.4 Gene Expression Analysis
  242. 12.4.1 The Problem: Analyzing Gene Expressions in Baker’s Yeast (Saccharomyces cerevisiae)
  243. 12.4.2 The Data
  244. 12.4.3 Filtering the Genes
  245. 12.4.4 Principal Component Analysis
  246. 12.4.5 Cluster Analysis: Self-Organizing Maps
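The functions listed under Chapter 2 compose directly into a trainable network. As a minimal sketch of that workflow (assuming the Deep Learning Toolbox is installed; `XTrain`/`YTrain` are hypothetical 28-by-28 grayscale images and their labels):

```matlab
% Minimal CNN layer stack built from the Chapter 2 functions
% (a sketch; XTrain/YTrain are hypothetical training data).
layers = [
    imageInputLayer([28 28 1])          % 2.2.1: 28x28 grayscale input
    convolution2dLayer(5, 20)           % 2.2.2: 20 filters of size 5x5
    reluLayer                           % 2.2.3: ReLU nonlinearity
    maxPooling2dLayer(2, 'Stride', 2)   % 2.2.6: 2x2 max pooling
    fullyConnectedLayer(10)             % 2.2.7: 10 output classes
    softmaxLayer                        % 2.2.9: class probabilities
    classificationLayer];               % 2.2.10: cross-entropy output
opts = trainingOptions('sgdm');         % 2.3.2: SGD with momentum
% net = trainNetwork(XTrain, YTrain, layers, opts);   % 2.3.1: train
```

Later chapters (4, 5, and 12) reuse the same layer and training functions for feature extraction, transfer learning, and stacked autoencoders.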